# Multilingual Knowledge Distillation
## Yixin Distill Qwen 72B

- **Author:** YiXin-AILab
- **License:** Apache-2.0
- **Task:** Large Language Model
- **Format:** Safetensors; supports multiple languages

A high-performance distilled model optimized for mathematics and general reasoning, refined from Qwen2.5-72B through reinforcement learning.
## Shlm Grc En

- **Author:** kevinkrahn
- **License:** MIT
- **Task:** Text Embedding
- **Format:** Transformers; supports multiple languages

This model creates sentence embeddings for Ancient Greek and English texts in a shared vector space, based on an improved HLM architecture and trained through multilingual knowledge distillation.
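The multilingual knowledge distillation mentioned above can be sketched as follows. This is a toy illustration, not the model's actual training code: a frozen "teacher" embeds English sentences, and a "student" is trained so that its embeddings of both an English sentence and its Ancient Greek translation match the teacher's English embedding, pulling the two languages into one shared vector space. The encoders here are hypothetical random linear maps standing in for real transformer encoders, and the data is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
dim_in, dim_out = 16, 8

teacher = rng.normal(size=(dim_in, dim_out))  # frozen teacher projection
student = rng.normal(size=(dim_in, dim_out))  # student projection (trained)

# Synthetic parallel corpus: row i of `english` and row i of `greek` are
# feature vectors for a sentence and its translation (hypothetical data).
english = rng.normal(size=(32, dim_in))
greek = rng.normal(size=(32, dim_in))

def mse(a, b):
    return float(np.mean((a - b) ** 2))

target = english @ teacher  # teacher embeddings of the English side
initial_loss = mse(english @ student, target)

lr = 0.01
for step in range(500):
    # Distillation objective: the student must map BOTH the English sentence
    # and its Greek translation onto the teacher's English embedding.
    grad_en = 2 * english.T @ (english @ student - target) / len(english)
    grad_gr = 2 * greek.T @ (greek @ student - target) / len(greek)
    student -= lr * (grad_en + grad_gr)

loss_en = mse(english @ student, target)
loss_gr = mse(greek @ student, target)
print(f"English-side loss: {initial_loss:.4f} -> {loss_en:.4f}")
print(f"Greek-side loss after training: {loss_gr:.4f}")
```

Because both languages are regressed onto the same teacher vectors, translations end up close together in the student's embedding space, which is the property the model card describes.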